Error analysis for physics-informed neural networks (PINNs) approximating Kolmogorov PDEs
Authors
Abstract
Physics-informed neural networks (PINNs) approximate solutions of PDEs by minimizing pointwise residuals. We derive rigorous bounds on the error incurred by PINNs in approximating a large class of linear parabolic PDEs, namely Kolmogorov equations, which include the heat equation and the Black-Scholes equation of option pricing as examples. We construct neural networks whose PINN residual (generalization error) can be made as small as desired. We also prove that the total L^2-error can be bounded by the generalization error, which in turn is bounded in terms of the training error, provided that a sufficient number of randomly chosen (collocation) points is used. Moreover, we prove that the size of these networks and the number of samples only grow polynomially with the underlying dimension, enabling PINNs to overcome the curse of dimensionality in this context. These results enable us to provide a comprehensive error analysis for PINNs approximating Kolmogorov PDEs.
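To make the setup concrete, below is a minimal, illustrative sketch (in PyTorch; the network size, sample counts, and training settings are arbitrary assumptions, not the constructions analyzed in the paper) of a PINN for the one-dimensional heat equation, a simple Kolmogorov equation. The network is trained by minimizing a Monte Carlo estimate of the squared pointwise PDE residual at randomly chosen collocation points, together with penalties for the initial and boundary data.

```python
import torch

torch.manual_seed(0)

# Small fully connected network u_theta(t, x); width and depth are arbitrary here.
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(tx):
    """Pointwise residual u_t - u_xx of the heat equation at points tx = (t, x)."""
    tx = tx.requires_grad_(True)
    u = net(tx)
    du = torch.autograd.grad(u, tx, torch.ones_like(u), create_graph=True)[0]
    u_t, u_x = du[:, 0:1], du[:, 1:2]
    u_xx = torch.autograd.grad(u_x, tx, torch.ones_like(u_x), create_graph=True)[0][:, 1:2]
    return u_t - u_xx

# Randomly chosen collocation points: interior of (0,1) x (0,1), spatial boundary
# x in {0, 1}, and the initial line t = 0 with data u(0, x) = sin(pi x).
n_int, n_bdry, n_init = 1024, 256, 256
tx_int = torch.rand(n_int, 2)
tx_bdry = torch.cat([torch.rand(n_bdry, 1),
                     torch.randint(0, 2, (n_bdry, 1)).float()], dim=1)
tx_init = torch.cat([torch.zeros(n_init, 1), torch.rand(n_init, 1)], dim=1)
u0 = torch.sin(torch.pi * tx_init[:, 1:2])

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    # Training loss: Monte Carlo estimate of the squared PDE residual plus
    # penalties enforcing the boundary and initial conditions.
    loss = (pde_residual(tx_int) ** 2).mean() \
        + (net(tx_bdry) ** 2).mean() \
        + ((net(tx_init) - u0) ** 2).mean()
    loss.backward()
    opt.step()
```

In the paper's terminology, the Monte Carlo sum over the collocation points plays the role of the training error, and the analysis bounds the generalization error and the total L^2-error of the trained network in terms of it.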
Similar references
Rodbar dam slope stability analysis using neural networks
In this study, an artificial neural network is presented for predicting the safety factor and the critical safety factor of non-homogeneous earth dams, taking into account the effect of earthquake inertial forces. The model inputs include the dam height, upstream slope angle, seismic coefficient, water level, and the strength parameters of the core and shell, and its output is the safety factor. The most important quantity sought in slope stability analysis is the safety factor. In this study ... (a minimal illustrative sketch of such a model follows.)
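As a rough illustration of the kind of model described above (the feature set, architecture, and training settings below are assumptions for the sketch, not the study's actual network or data), a small feed-forward regressor mapping dam and loading parameters to a predicted factor of safety could look as follows:

```python
import torch

# Hypothetical input features: dam height, upstream slope angle, seismic
# coefficient, water level, and core/shell strength parameters (6 inputs in total).
model = torch.nn.Sequential(
    torch.nn.Linear(6, 16), torch.nn.ReLU(),
    torch.nn.Linear(16, 16), torch.nn.ReLU(),
    torch.nn.Linear(16, 1),  # output: predicted factor of safety
)

def train(x, y, epochs=500, lr=1e-3):
    """Fit the MLP to (parameters, factor-of-safety) pairs with an MSE loss."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return model

# Example usage with random placeholder data (a stand-in for real measurements).
x = torch.rand(200, 6)
y = torch.rand(200, 1) + 1.0
train(x, y)
```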
Approximating Rough Stochastic PDEs
We study approximations to a class of vector-valued equations of Burgers type driven by a multiplicative space-time white noise. A solution theory for this class of equations has been developed recently in [Hairer, Weber, Probab. Theory Related Fields, 2013]. The key idea was to use the theory of controlled rough paths to give definitions of weak / mild solutions and to set up a Picard iteratio...
Deep Jointly-Informed Neural Networks
In this work a novel, automated process for determining an appropriate deep neural network architecture and weight initialization based on decision trees is presented. The method maps a collection of decision trees trained on the data into a collection of initialized neural networks, with the structure of the network determined by the structure of the tree. These models, referred to as “deep jo...
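As a very loose sketch of the general idea of letting a fitted decision tree dictate a network architecture (this only mirrors the choice of hidden-layer widths and depth, not the weight-initialization mapping described in the cited work; all names and settings here are illustrative assumptions), one could do something like the following:

```python
import numpy as np
import torch
from sklearn.tree import DecisionTreeRegressor

def tree_guided_mlp(x, y, max_depth=4):
    """Fit a decision tree, then size an MLP's hidden layers from the number
    of internal (split) nodes at each depth level of the tree."""
    tree = DecisionTreeRegressor(max_depth=max_depth).fit(x, y)
    t = tree.tree_
    # Count internal nodes per depth level via a simple traversal.
    counts = {}
    stack = [(0, 0)]  # (node_id, depth)
    while stack:
        node, depth = stack.pop()
        if t.children_left[node] != -1:  # internal node (leaves are marked -1)
            counts[depth] = counts.get(depth, 0) + 1
            stack.append((t.children_left[node], depth + 1))
            stack.append((t.children_right[node], depth + 1))
    widths = [max(counts[d], 2) for d in sorted(counts)]
    # Assemble an MLP whose hidden widths follow the per-level node counts.
    layers, n_in = [], x.shape[1]
    for w in widths:
        layers += [torch.nn.Linear(n_in, w), torch.nn.ReLU()]
        n_in = w
    layers.append(torch.nn.Linear(n_in, 1))
    return torch.nn.Sequential(*layers)

# Example usage with placeholder data.
X = np.random.rand(200, 5)
Y = X.sum(axis=1)
net = tree_guided_mlp(X, Y)
```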
Physics of Neural Networks
The material basis of our thinking, intelligence and creativity are 10^13-10^14 nerve cells (neurons), which in our brain are densely packed into a grey substance weighing about 1.5 kg. Each neuron has, like the root of a tree, highly branched dendrites that collect information from about 10000 other nerve cells. This huge network, which on a microscopic scale looks rather homogeneous and diso...
Computational physics: Neural networks
Contents (excerpt): 2 Networks of binary neurons; 2.1 Neural information processing is noisy; 2.2 Stochastic binary neurons and networks; 2.2.1 Parallel dynamics: Little model; 2.2.2 Sequential dynamics; 2.3 Some properties of Markov processes; 2.3.1 Eigenvalue s...
Journal
Journal title: Advances in Computational Mathematics
Year: 2022
ISSN: 1019-7168, 1572-9044
DOI: https://doi.org/10.1007/s10444-022-09985-9